Pushing RTX

Now that we have a basic understanding of raytracing and how it works within nkGraphics, it is time to create an effect that is hard to reach with rasterization. During this tutorial, we will see how we can push raytracing a bit further to use different resources, and how nkGraphics allows us to do so by reusing the same API as in past tutorials !

During this tutorial, we will use raytracing to get back to the result we had with rasterization : an environment map with a sphere reflecting it. However, we will introduce a second sphere to make things spicier ! Indeed, with default rasterization, both spheres would not be able to easily reflect each other... But this is something we will get right away with raytracing !

Here is the result we will get after going through this tutorial :

[Image : tutorial result]

Let's dig in without waiting !

Updating the raygen / miss shader

Starting from the last tutorial, what we need is a more complex shader for when rays miss geometry. This is what will paint our background : when we don't hit any geometry, we simply sample the environment map we used up till now. Let's see the updated Program :

struct RayPayload
{
    float4 color ;
    float depth ;
} ;

cbuffer PassConstants : register(b0)
{
    uint4 texInfos ;
    float4 camPos ;
    matrix invView ;
    matrix invProj ;
}

RaytracingAccelerationStructure scene : register(t0) ;
RWStructuredBuffer<float4> output : register(u0) ;

// This time, we will add the environment map and a sampler to be able to use it in one or more stages
TextureCube envMap : register(t1) ;
SamplerState envSampler : register(s0) ;

[shader("raygeneration")]
void raygen ()
{
    // Compute Ray's origin, as simple units
    float2 dispatchIndex = DispatchRaysIndex().xy ;
    float2 pixCenter = dispatchIndex.xy + 0.5 ;
    float2 uvs = pixCenter / texInfos.xy * 2.0 - 1.0 ;
    uvs.y = -uvs.y ;

    float3 pixelOrigin = camPos.xyz ;
    float4 pixelDir = mul(invView, mul(invProj, float4(uvs, 0, 1))) ;
    pixelDir.xyz /= pixelDir.w ;
    float3 pixelDirVec3 = normalize(pixelDir.xyz - pixelOrigin) ;

    // Trace the ray
    RayDesc ray ;
    ray.Origin = pixelOrigin ;
    ray.Direction = pixelDirVec3 ;
    ray.TMin = 0.001 ;
    ray.TMax = 100.0 ;

    RayPayload payload = {float4(1, 1, 1, 1), 1} ;

    // Simple tracing
    TraceRay(scene, RAY_FLAG_NONE, ~0, 0, 1, 0, ray, payload) ;

    // And writing
    uint index = dispatchIndex.y * texInfos.x + dispatchIndex.x ;
    output[index] = float4(payload.color.xyz, 1) ;
}

[shader("miss")]
void miss (inout RayPayload payload)
{
    // When we miss a geometry, we will sample the environment map to make a nice background
    // Notice that we need to specify the mip level to sample : consider raytracing programs as compute programs, Sample is unavailable
    payload.color = envMap.SampleLevel(envSampler, normalize(WorldRayDirection()), 0) ;
}

The raygen program is left untouched. However, the miss program will now sample the environment map, so we add the TextureCube and SamplerState declarations. It uses the provided WorldRayDirection function to retrieve the ray's direction in world space and sample from it. Note that we cannot use the Sample function in a raytracing stage : like compute stages, we don't work with pixel groups, and thus DirectX cannot derive a mip level to use for a given pixel.
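To make the constraint concrete, here is a minimal sketch of the difference, as it would appear inside the miss stage :

float3 dir = normalize(WorldRayDirection()) ;

// Implicit-LOD sampling needs screen-space derivatives, which raytracing stages don't have :
// payload.color = envMap.Sample(envSampler, dir) ; // Does not compile in a raytracing stage

// Explicit-LOD sampling works anywhere, as we choose the mip level ourselves :
payload.color = envMap.SampleLevel(envSampler, dir, 0) ;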

Now that we use a new texture and a new sampler, we need to feed them from the Shader. Compared to what we had in the last tutorial, we only need to add :

nkGraphics::Texture* tex = nkGraphics::TextureManager::getInstance()->get("tex") ;
raygenMissShader->addTexture(tex, 1) ;

nkGraphics::Sampler* sampler = nkGraphics::SamplerManager::getInstance()->get("sampler") ;
raygenMissShader->addSampler(sampler, 0) ;

Which is basically how we would do it for any shader, by linking a resource to its slot.

With this, our raygen / miss shader is ready to go ! If we were to launch the program now, we would already see that the background has changed to our environment map. Now to work on the geometry itself !

Updating the hit program

As we mentioned, we want to add shiny reflections on the spheres that will use the shader. Let's update the program accordingly :

struct RayPayload
{
    float4 color ;
    float depth ;
} ;

// Vertex data composition
// Needs to be aligned on the mesh layout itself
struct VertexData
{
    float3 position ;
    float2 uvs ;
    float3 normal ;
} ;

RaytracingAccelerationStructure scene : register(t0) ;
TextureCube envMap : register(t1) ;

// Mesh data, naming is important as it is how nkGraphics can get the slots back
StructuredBuffer<VertexData> _vertexData : register(t2) ;

SamplerState envSampler : register(s0) ;

[shader("closesthit")]
void closestHit (inout RayPayload payload, in BuiltInTriangleIntersectionAttributes attr)
{
    // Compute barycentrics
    float3 bary = float3(1.0 - attr.barycentrics.x - attr.barycentrics.y, attr.barycentrics.x, attr.barycentrics.y) ;
    uint primitiveIndex = PrimitiveIndex() ;

    // Find back all points constituting triangle
    float3 a = _vertexData[primitiveIndex * 3 + 0].normal ;
    float3 b = _vertexData[primitiveIndex * 3 + 1].normal ;
    float3 c = _vertexData[primitiveIndex * 3 + 2].normal ;

    // Recompute from barycenters in triangle
    float3 hitPosition = WorldRayOrigin() + WorldRayDirection() * RayTCurrent() ;
    float3 foundNormal = normalize(a * bary.x + b * bary.y + c * bary.z) ;
    float3 worldDir = normalize(WorldRayDirection()) ;

    // Prepare new ray to fire
    RayDesc ray ;
    ray.Origin = hitPosition ;
    ray.Direction = normalize(reflect(worldDir, foundNormal)) ;
    ray.TMin = 0.001 ;
    ray.TMax = 100.0 ;

    // Update payload through a new tracing
    if (payload.depth < 6)
    {
        payload.depth += 1.0 ;
        TraceRay(scene, RAY_FLAG_NONE, ~0, 0, 1, 0, ray, payload) ;
    }
    else
        payload.color = envMap.SampleLevel(envSampler, worldDir, 0) ;
}

Here we are introducing quite a few new resources and data structure changes. Let's go through them in order :

  1. While not mentioned earlier, the ray payload has been updated to also carry a depth member. This is because we need to give a RaytracingPass the maximum recursion depth (consecutive ray relaunching), which we will detail in a later step. This member will be of use in the hit stage.
  2. We added the acceleration structure to the resources the shader can access : an AS can be fed to any shader that can read it.
  3. We also fed the environment map, and its associated sampler.
  4. As we need to know how to reflect our rays, we also need the geometry available, to get the normal of a triangle.

This last point is quite special, as this is something we don't really link to any shader slot. For a given mesh that is part of a raytraced render queue, nkGraphics will link the geometry attached to it during a hit. If you need to access such data, you need to name your slots _vertexData and _indexData respectively. The slots need to be texture slots, with no constraint on the register index : they will be detected when the program gets compiled.

This information is accessed through StructuredBuffer declarations. Indices come as triangle indices, thus as uint3. The vertex data, however, is a little more involved to get right.
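For illustration, here is a sketch of what an indexed fetch could look like, assuming the mesh comes with an index buffer ; the register index is arbitrary, only the names matter :

// Declaration, next to the _vertexData declaration we already have
StructuredBuffer<uint3> _indexData : register(t3) ;

// Inside the hit function : fetch the triangle's indices, then each vertex through them
uint3 indices = _indexData[PrimitiveIndex()] ;
float3 a = _vertexData[indices.x].normal ;
float3 b = _vertexData[indices.y].normal ;
float3 c = _vertexData[indices.z].normal ;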

Currently, there is no way for nkGraphics to know how the meshes are laid out. This means you need to get the structures right yourself, with the right attribute types and ordering. Otherwise, you might end up sampling data incorrectly, as the offsets will be wrong. A consequence is that one hit program can only be linked to one geometry layout type, as there is no way to change how a buffer is interpreted within the same function.

As of now, this is the biggest constraint when accessing geometry data, as there is no pipeline stage feeding the data layout for us. Here, we have a sphere mesh exposing positions, uvs, and normals, interleaved in the same order as described in the HLSL. Correctly aligning this structure ensures we sample the normals correctly in the function.
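To make the alignment concrete, here is how the VertexData structure from the hit program maps onto the interleaved vertex stream, byte by byte. StructuredBuffer elements are tightly packed, so any mismatch shifts every following attribute :

// Each vertex occupies 32 bytes in the interleaved stream
struct VertexData
{
    float3 position ; // Bytes 0 - 11
    float2 uvs ;      // Bytes 12 - 19
    float3 normal ;   // Bytes 20 - 31
} ;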

Speaking of the function, the way to sample geometry is to use the barycentric information from the intersection, provided by the hit stage. Using it, we can recompute how to interpolate between the attributes of the triangle's 3 points, which we find back using the PrimitiveIndex function offered to us. From there, we can reconstruct the geometry normal and the intersection position, which enables us to reflect our ray.

Now the magic part of raytracing begins ! We can construct a new ray, starting from our intersection and going in the reflected direction. We can then trace the scene again with this new ray, simply forwarding the payload so that it gets altered deeper in the recursion.

Note that we are tracking the recursion depth of a ray, and preventing a new ray from being launched once we hit a certain threshold. This is to prevent driver crashes : going past the max recursion depth specified in a pass will at best crash the driver and application. Because DXR allocates as few resources as possible when pre-building everything, going further than that might lead it to read / write unwanted memory regions, causing device removals. We will see how to alter this max recursion depth when working on the composition.

This concludes the big changes to the hit function. Let's see what we need to add to the Shader :

nkGraphics::RenderQueue* rq = nkGraphics::RenderQueueManager::getInstance()->get(nkGraphics::RenderQueueManager::DEFAULT_RENDER_QUEUE) ;
hitShader->addTexture(rq->getAccelerationStructureBuffer(), 0) ;

// Give it related texture info
nkGraphics::Texture* envMap = nkGraphics::TextureManager::getInstance()->get("tex") ;
hitShader->addTexture(envMap, 1) ;

nkGraphics::Sampler* envSampler = nkGraphics::SamplerManager::getInstance()->get("sampler") ;
hitShader->addSampler(envSampler, 0) ;

As mentioned earlier :

  1. The acceleration structure is fetched from the render queue, and linked like any readable resource.
  2. The environment map and its sampler are linked to their slots, like for any other shader.
  3. The _vertexData buffer does not appear : nkGraphics links the hit geometry automatically.

Now that our shaders are ready, let's see what we need to update in our composition.

Updating composition : the recursion depth

As mentioned in the hit program update, we are now tracing rays recursively. This requires us to be aware of what we are doing in there : getting it wrong will crash the driver, as it will most probably read from unwanted memory addresses.

rtPass->setMaxTraceRecursionDepth(6) ;

When setting up the RaytracingPass, it is possible to override the max recursion depth you expect to reach. Here, we will work with 6 bounces, as this should be sufficient to get good fidelity in the scene. But remember the safeguard we implemented in the hit function : theoretically, a ray could bounce many more times between the two spheres, depending on its angle. As such, we needed to find a good balance between what could happen and what we want.

Remember that this should be kept as low as possible, for better performance.

Updating the scene

After having overridden the max recursion depth, what is left is adding the second sphere to the scene :

shader = nkGraphics::ShaderManager::getInstance()->get("reflectionHit") ;

ent = rq->addEntity() ;
ent->setRaytracingShader(shader) ;

nkGraphics::Mesh* sphere = nkGraphics::MeshManager::getInstance()->get("Mesh") ;
nkGraphics::SubEntity* subEnt = ent->addChild() ;
subEnt->setMesh(sphere) ;

nkGraphics::Node* node = nkGraphics::NodeManager::getInstance()->create() ;
node->setPositionAbsolute(nkMaths::Vector(0.f, 5.f, 0.f)) ;
ent->setParentNode(node) ;

The usual business when working with render queues and graph nodes ! With the difference, of course, that we need to set the raytracing shader, for our RaytracingPass.

One final touch, which won't be detailed here, is that we also make this new sphere orbit around the original one, through :

loopPos = nkMaths::Vector(std::cos(currentLoopT) * 3.f, std::sin(currentLoopT) * 3.f, 0.f) ;
loopPos += sphereCenter ;
sphereNode->setPositionAbsolute(loopPos) ;

This is done each frame, in the custom rendering loop we wrote in the 4th tutorial. Note that once again there is no special trick : we update the node, like we would with rasterized passes. This will trigger the acceleration structure updates for us.

Beware though : the update is asynchronous. This means that if you need to move something for a static image that will get generated right after, you will need to flush rendering through Renderer::flushRendering().
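As a sketch, assuming a renderer pointer to the nkGraphics Renderer we retrieved in earlier tutorials, generating a still image after a move could look like :

// Move the node : the acceleration structure update is scheduled asynchronously
sphereNode->setPositionAbsolute(loopPos) ;

// Ensure the update has been processed before generating the image
renderer->flushRendering() ;

// The next rendered frame will now reflect the new sphere position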

Result

We can now test the program with all the changes :

[Image : shiny spheres moving !]

Notice how both spheres reflect each other, in real time ! This is a tricky effect to achieve with pure rasterization, yet with raytracing we get it naturally by recursively tracing the scene on each intersection. This perfectly showcases why raytracing is a big thing !

Conclusion

Through this tutorial, we saw how to push raytracing further, and witnessed how powerful it can be. We also saw how nkGraphics allows us to do so without requiring many changes to what we already know from the earlier rasterization tutorials.

Now we know about :

  1. Feeding textures and samplers to raytracing shaders, like for any other shader.
  2. Accessing the acceleration structure and the mesh geometry from a hit program.
  3. Tracing rays recursively, and capping the max recursion depth of a RaytracingPass.
  4. Updating a raytraced scene through nodes, with the acceleration structures following automatically.

Many of these points are not really new, in fact. And now that we know about all of this, the only thing remaining is to use this new knowledge to build ever more powerful effects !